Inferring Covariances for Probabilistic Programs
Authors
Abstract
We study weakest precondition reasoning about the (co)variance of outcomes and the variance of run-times of probabilistic programs with conditioning. For outcomes, we show that approximating (co)variances is computationally harder than approximating expected values. In particular, we prove that computing both lower and upper bounds for (co)variances is Σ⁰₂-complete. As a consequence, neither lower nor upper bounds are computably enumerable. We therefore present invariant-based techniques that do enable enumeration of both upper and lower bounds, once appropriate invariants are found. Finally, we extend this approach to reasoning about run-time variances.
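The connection between variances and weakest pre-expectations can be sketched as follows. The notation wp[C](f), denoting the expected value of f after executing program C, is assumed here from the standard weakest pre-expectation calculus and is not quoted from the paper itself:

```latex
% Variance of an outcome f of program C, expressed via weakest pre-expectations
% (assumed notation: wp[C](f) is the expected value of f after running C).
\mathrm{Var}[C](f) \;=\; wp[C]\!\left(f^2\right) \;-\; \bigl(wp[C](f)\bigr)^{2}
```

One intuition for the hardness gap stated above: an upper bound on the variance requires an upper bound on the second moment wp[C](f²) together with a lower bound on the first moment wp[C](f), so bounding a variance combines both directions of approximation of expected values.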
Similar Resources
Probabilistic programs for inferring the goals of autonomous agents
Intelligent systems sometimes need to infer the probable goals of people, cars, and robots, based on partial observations of their motion. This paper introduces a class of probabilistic programs for formulating and solving these problems. The formulation uses randomized path planning algorithms as the basis for probabilistic models of the process by which autonomous agents plan to achieve their...
Logic Program Induction using MDL and MAP: An Application to Grammars
Probabilistic programs provide an appealing language for describing mental theories, because they are Turing complete: any computable process may be described as a program. Program induction is the problem of inferring theories, in the form of (probabilistic) programs, that describe some set of observations. Minimum Description Length, or MDL, is one common approach to program induction [11]. T...
Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs
The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement. Instead, most vision tasks are approached via complex bottom-up processing pipelines. Here we show that it is possible to write short, simple probabilistic graphics programs that define flexible generative models and to...
A Provably Correct Sampler for Probabilistic Programs
We consider the problem of inferring the implicit distribution specified by a probabilistic program. A popular inference technique for probabilistic programs called Markov Chain Monte Carlo or MCMC sampling involves running the program repeatedly and generating sample values by perturbing values produced in “previous runs”. This simulates a Markov chain whose stationary distribution is the dist...
Reconstruction of sparse connectivity in neural networks from spike train covariances
The inference of causation from correlation is in general highly problematic. Correspondingly, it is difficult to infer the existence of physical synaptic connections between neurons from correlations in their activity. Covariances in neural spike trains and their relation to network structure have been the subject of intense research, both experimentally and theoretically. The influence of rec...